
Conversation

andrewrosemberg (Collaborator) commented Feb 22, 2025

Pull Request Summary

Title: Objective Sensitivity

Description: This pull request implements functionality in the DiffOpt.jl package for obtaining the sensitivity of the objective value with respect to parameters in forward mode, and the corresponding parameter sensitivities for a given objective perturbation in reverse mode.

  • Additions: 335 lines
  • Deletions: 35 lines
  • Changed Files: 7
  • Commits: 11
  • Comments: 1
  • Review Comments: 4
  • State: Open
  • Mergeable: Yes (unstable)

Usage Example
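
The snippets below assume a parameterized model that has already been built and optimized through DiffOpt; the setup itself is not part of the PR. A minimal hypothetical sketch (the solver choice, the variable x, and the particular objective and constraint are assumptions; only p and p_c appear in the PR description) might look like:

using JuMP, DiffOpt, Ipopt

# Hypothetical setup: a small parameterized program solved through DiffOpt's wrapper.
model = Model(() -> DiffOpt.diff_optimizer(Ipopt.Optimizer))
set_silent(model)

@variable(model, x)
@variable(model, p in Parameter(1.0))    # parameter entering the constraint
@variable(model, p_c in Parameter(2.0))  # parameter entering the objective
@constraint(model, con, x >= 3 * p)
@objective(model, Min, p_c * x^2)
optimize!(model)

Which DiffOpt backend ends up handling the model, and whether objective sensitivity is available for it, depends on the problem class; the TODO list below notes that some backends are not yet covered. With such a model optimized, the forward-mode query from the PR reads: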

# Always a good practice to clear previously set sensitivities
DiffOpt.empty_input_sensitivities!(model)

MOI.set(model, DiffOpt.ForwardConstraintSet(), ParameterRef(p), Parameter(3.0))
MOI.set(model, DiffOpt.ForwardConstraintSet(), ParameterRef(p_c), Parameter(3.0))
DiffOpt.forward_differentiate!(model)

MOI.get(model, DiffOpt.ForwardObjectiveSensitivity())
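
For a model like the sketch above, the optimal value is φ(p, p_c) = 9 p² p_c, so ∂φ/∂p = 18 p p_c = 36 and ∂φ/∂p_c = 9 p² = 9 at (p, p_c) = (1, 2). Assuming ForwardObjectiveSensitivity follows the usual forward-mode convention of returning the directional derivative of the optimal objective along the perturbations set above (Δp = Δp_c = 3.0), the query would return 3 · 36 + 3 · 9 = 135.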

In reverse (backward) mode, we can compute the parameter perturbations corresponding to a given objective perturbation:

# Always a good practice to clear previously set sensitivities
DiffOpt.empty_input_sensitivities!(model)

MOI.set(
    model,
    DiffOpt.ReverseObjectiveSensitivity(),
    0.1,
)

DiffOpt.reverse_differentiate!(model)

MOI.get(model, DiffOpt.ReverseConstraintSet(), ParameterRef(p))
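
As with the existing parameter attributes, the result of ReverseConstraintSet for a parameter is presumably returned as an MOI.Parameter, so the scalar sensitivity would be read from its value field. For the hypothetical model sketched above, seeding the objective adjoint with 0.1 would give 0.1 · ∂φ/∂p = 0.1 · 36 = 3.6 for p:

# Hypothetical accessor, assuming the result comes back as an MOI.Parameter
MOI.get(model, DiffOpt.ReverseConstraintSet(), ParameterRef(p)).value  # ≈ 3.6 for the sketch above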

TODOs:

  • Rebase
  • Add an error for when the user requests this for backends where it is not yet implemented.
  • Create issues about implementing it for other formulations.


codecov bot commented Feb 22, 2025

Codecov Report

❌ Patch coverage is 88.88889% with 8 lines in your changes missing coverage. Please review.
✅ Project coverage is 89.18%. Comparing base (93f058d) to head (a15f6f6).
⚠️ Report is 3 commits behind head on master.

Files with missing lines                  | Patch %  | Lines
src/NonLinearProgram/NonLinearProgram.jl  | 83.33%   | 6 Missing ⚠️
src/diff_opt.jl                           | 85.71%   | 1 Missing ⚠️
src/moi_wrapper.jl                        | 88.88%   | 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #282      +/-   ##
==========================================
+ Coverage   89.08%   89.18%   +0.10%     
==========================================
  Files          15       15              
  Lines        1969     2006      +37     
==========================================
+ Hits         1754     1789      +35     
- Misses        215      217       +2     


andrewrosemberg changed the title from "Implement dual of parameter anywhere" to "[WIP] Objective Sensitivity" on Feb 25, 2025
andrewrosemberg changed the title from "[WIP] Objective Sensitivity" to "Objective Sensitivity" on Feb 27, 2025
@@ -199,6 +215,21 @@ struct ForwardConstraintDual <: MOI.AbstractConstraintAttribute end

MOI.is_set_by_optimize(::ForwardConstraintDual) = true
andrewrosemberg (author) commented:

codecov thinks this line is untested, but it is used internally in the tests.

"""
struct ForwardObjectiveSensitivity <: MOI.AbstractModelAttribute end

MOI.is_set_by_optimize(::ForwardObjectiveSensitivity) = true
andrewrosemberg (author) commented:

ditto, codecov mistake

@frapac left a comment:

this PR looks good to me, overall!

MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, 1.0)

# Compute derivatives
@test_throws ErrorException DiffOpt.reverse_differentiate!(model)

nice!

andrewrosemberg (author) commented:

closing in favor of #303
